Distributed Bayes Blocks for Variational Bayesian Learning

Author

  • Antti Honkela
Abstract

In this work, preliminary results on a distributed version of the Bayes Blocks software library [1, 4] are presented. Bayes Blocks is a software implementation of the variational Bayesian building block framework [7], which allows automated derivation of variational learning procedures for a variety of models, including nonlinear and variance models. The library is implemented in C++ with Python bindings. The underlying framework resembles the variational message passing (VMP) framework [8], but includes support for nonlinear and other models that are not in the conjugate exponential family.
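Bayes Blocks itself exposes C++ node classes through Python bindings; the sketch below does not use the library's actual API. It is a standalone illustration of the kind of factorized (mean-field) coordinate update that building-block frameworks of this sort derive automatically for each node: a Gaussian with unknown mean and precision, where q(mu, tau) = q(mu) q(tau) and each factor is updated in closed form given the expectations of the other. The function name and prior settings are illustrative assumptions, not part of Bayes Blocks.

```python
import numpy as np

def vb_gaussian(x, mu0=0.0, lam0=1.0, a0=1e-3, b0=1e-3, iters=50):
    """Mean-field variational Bayes for a univariate Gaussian.

    Model: x_i ~ N(mu, 1/tau), mu ~ N(mu0, 1/(lam0*tau)),
           tau ~ Gamma(a0, b0).
    Approximation: q(mu, tau) = q(mu) q(tau), updated by
    coordinate ascent on the variational lower bound.
    """
    N, xbar = len(x), np.mean(x)
    E_tau = a0 / b0  # initial guess for E[tau]
    mu_N = lam_N = a_N = b_N = None
    for _ in range(iters):
        # Update q(mu) = N(mu_N, 1/lam_N) given E[tau]
        mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
        lam_N = (lam0 + N) * E_tau
        E_mu, E_mu2 = mu_N, mu_N ** 2 + 1.0 / lam_N
        # Update q(tau) = Gamma(a_N, b_N) given E[mu], E[mu^2]
        a_N = a0 + 0.5 * (N + 1)
        b_N = b0 + 0.5 * (np.sum(x ** 2) - 2 * E_mu * np.sum(x) + N * E_mu2
                          + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0 ** 2))
        E_tau = a_N / b_N
    return mu_N, lam_N, a_N, b_N
```

In a building-block framework, each such update is local: a node only needs expectations of its Markov blanket, which is what makes both automated derivation and (as in the distributed version reported here) distribution of the computation across machines possible.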


Similar Articles

Bayes Blocks: An Implementation of the Variational Bayesian Building Blocks Framework

A software library for constructing and learning probabilistic models is presented. The library offers a set of building blocks from which a large variety of static and dynamic models can be built. These include hierarchical models for variances of other variables and many nonlinear models. The underlying variational Bayesian machinery, providing for fast and robust estimation but being mathema...


On Variational Bayes Algorithms for Exponential Family Mixtures

In this paper, we empirically analyze the behavior of the Variational Bayes algorithm for the mixture model. While Variational Bayesian learning has provided computational tractability and good generalization performance in many applications, little has been done to investigate its properties. Recently, the stochastic complexity of mixture models in Variational Bayesian learning was cl...


Algorithmic improvements for variational inference

Variational methods for approximate inference in machine learning often adapt a parametric probability distribution to optimize a given objective function. This view is especially useful when applying variational Bayes (VB) to models outside the conjugate-exponential family. For them, variational Bayesian expectation maximization (VB EM) algorithms are not easily available, and gradient-based m...


An Alternative View of Variational Bayes and Minimum Variational Stochastic Complexity

Bayesian learning is widely used in many applied data-modelling problems and is often accompanied by approximation schemes, since it requires intractable computation of the posterior distributions. In this study, we focus on two approximation methods: variational Bayes and the local variational approximation. We show that the variational Bayes approach for statistical models with latent...


D-MFVI: Distributed Mean Field Variational Inference using Bregman ADMM

Bayesian models provide a framework for probabilistic modelling of complex datasets. However, many such models are computationally demanding, especially in the presence of large datasets. On the other hand, in sensor network applications, statistical (Bayesian) parameter estimation usually needs distributed algorithms, in which both data and computation are distributed across the nodes of the...



Journal:

Volume   Issue 

Pages  -

Publication date: 2006